Handle multiple providers/llms #45
Conversation
# AWS_ACCESS_KEY_ID=
# AWS_SECRET_ACCESS_KEY=
Required for AWS Bedrock models
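A brief sketch of how those variables are consumed. LiteLLM's Bedrock provider picks up AWS credentials from the standard environment variables via the boto3 credential chain; the values below are placeholders, and the model string is the one mentioned later in this review:

```python
import os

# Placeholder credentials, normally supplied via the env file rather than code
os.environ.setdefault("AWS_ACCESS_KEY_ID", "placeholder-key-id")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", "placeholder-secret")

# Bedrock models are then addressed through LiteLLM with a "bedrock/" prefix
model = "bedrock/us.anthropic.claude-3-5-sonnet-20241022-v2:0"
```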
    "ignore",
    module=".*(pydantic).*",
    category=UserWarning,
)
Pydantic warnings coming from the ChatLiteLLM class.
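The hunk above is the tail of a `warnings.filterwarnings` call. A self-contained sketch of how such a module-scoped filter behaves (the module names below are illustrative, not taken from the PR):

```python
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # record everything by default
    # Same filter shape as in the diff: drop UserWarnings originating from
    # any module whose name contains "pydantic"
    warnings.filterwarnings(
        "ignore",
        module=".*(pydantic).*",
        category=UserWarning,
    )
    # Simulate a warning raised from inside pydantic (suppressed) ...
    warnings.warn_explicit(
        "deprecated field", UserWarning, "pydantic/main.py", 10, module="pydantic.main"
    )
    # ... and one raised from application code (kept)
    warnings.warn_explicit(
        "app warning", UserWarning, "app/views.py", 5, module="app.views"
    )

# Only the non-pydantic warning survives the filter
assert [str(w.message) for w in caught] == ["app warning"]
```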
            **(self.proxy.get_api_kwargs() if self.proxy else {}),
            **(self.proxy.get_additional_kwargs(self) if self.proxy else {}),
            **kwargs,
        )
-        if self.temperature:
+        if self.temperature and self.model not in settings.AI_UNSUPPORTED_TEMP_MODELS:
The new o3-mini model does not support the temperature parameter and will raise an exception if it is passed along.
-    agent_graph.add_conditional_edges(
-        agent_node,
-        continue_on_tool_call,
-        {
-            # If tool requested then we call the tool node.
-            CONTINUE: tools_node,
-            # Otherwise finish.
-            END: END,
-        },
-    )
+    agent_graph.add_conditional_edges(agent_node, tools_condition)
This example was probably more verbose than it needed to be; the built-in tools_condition function should suffice for most use cases.
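For reference, the routing the prebuilt helper performs can be sketched without importing langgraph. This is an assumed simplification of its behavior, not the library's actual code, and the messages are plain dicts here purely for illustration:

```python
END = "__end__"  # langgraph's END sentinel value


def tools_condition_sketch(state: dict) -> str:
    """Route to the tools node when the last message requested a tool call."""
    last = state["messages"][-1]
    return "tools" if last.get("tool_calls") else END


# A message carrying tool calls routes to the tools node ...
assert tools_condition_sketch({"messages": [{"tool_calls": [{"name": "search"}]}]}) == "tools"
# ... and a plain assistant reply ends the graph
assert tools_condition_sketch({"messages": [{"content": "done"}]}) == END
```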
@@ -447,7 +421,7 @@ def __init__(  # noqa: PLR0913
         super().__init__(
             user_id,
             name=name,
-            model=model or settings.AI_MODEL,
+            model=model or settings.AI_DEFAULT_SYLLABUS_MODEL,
After AWS Bedrock access is set up, the default syllabus model should be changed to bedrock/us.anthropic.claude-3-5-sonnet-20241022-v2:0
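A hedged sketch of what that settings change could look like; the env-var override pattern is an assumption, not taken from the PR:

```python
import os

# Assumed Django-style setting: default syllabus model, overridable via env,
# with the Bedrock Claude model mentioned above as the eventual default
AI_DEFAULT_SYLLABUS_MODEL = os.environ.get(
    "AI_DEFAULT_SYLLABUS_MODEL",
    "bedrock/us.anthropic.claude-3-5-sonnet-20241022-v2:0",
)

# LiteLLM model strings use a "provider/model" format
provider = AI_DEFAULT_SYLLABUS_MODEL.split("/", 1)[0]
```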
👍
What are the relevant tickets?
Closes https://github.com/mitodl/hq/issues/6622

Description (What does it do?)
Uses the ChatLiteLLM class for all LLM models.

How can this be tested?
Set OPENAI_API_KEY and go to http://ai.open.odl.local:8003. Ask some questions; it should work. Check the logs to confirm. To test a non-default model, add AI_DEFAULT_RECOMMENDATION_MODEL=openai/o3-mini to your backend.local.env file and restart containers.
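The env-file step from the testing notes, as a shell sketch (run from wherever backend.local.env lives; the restart command is an assumption about the project's setup):

```shell
# Point the recommendation bot at o3-mini via LiteLLM
echo "AI_DEFAULT_RECOMMENDATION_MODEL=openai/o3-mini" >> backend.local.env

# Then restart containers, e.g. (assumed command):
# docker compose restart
```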